Semantic Segmentation of Urban Scenes Using Dense Depth Maps

Authors

  • Chenxi Zhang
  • Liang Wang
  • Ruigang Yang
Abstract

In this paper we present a framework for semantic scene parsing and object recognition based on dense depth maps. Five view-independent 3D features that vary with object class are extracted from dense depth maps at the superpixel level to train a classifier with the randomized decision forest technique. Our formulation integrates multiple features in a Markov Random Field (MRF) framework to segment and recognize the different object classes in query street-scene images. We evaluate our method both quantitatively and qualitatively on the challenging Cambridge-driving Labeled Video Database (CamVid). The results show that, using only dense depth information, we achieve more accurate overall segmentation and recognition than sparse 3D features or appearance alone, or even the combination of the two, advancing the state of the art. Furthermore, by aligning the dense-depth-based 3D features into a unified coordinate frame, our algorithm can handle the special case of view changes between training and testing scenarios. A preliminary cross-training-and-testing evaluation shows promising results.
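
As a rough illustration of the pipeline summarized above (superpixel-level features from a dense depth map feeding a randomized decision forest, whose outputs would become unary terms of the MRF), the Python sketch below uses SLIC superpixels and scikit-learn's RandomForestClassifier. The pooled depth statistics are simple placeholders, not the paper's five view-independent 3D features, and the function and parameter names are hypothetical.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

def superpixel_depth_features(depth, n_segments=400):
    """Oversegment a dense depth map with SLIC and pool simple per-superpixel
    depth statistics (placeholders for the paper's five 3D features)."""
    # channel_axis=None marks a single-channel image (scikit-image >= 0.19)
    labels = slic(depth, n_segments=n_segments, compactness=0.1, channel_axis=None)
    gy, gx = np.gradient(depth)                 # crude surface-orientation proxy
    ids, feats = [], []
    for sp in np.unique(labels):
        m = labels == sp
        feats.append([depth[m].mean(), depth[m].std(),
                      np.abs(gx[m]).mean(), np.abs(gy[m]).mean()])
        ids.append(sp)
    return labels, np.asarray(ids), np.asarray(feats)

# Training: X stacks superpixel features over all training frames and y holds
# the majority ground-truth class of each superpixel.
#   clf = RandomForestClassifier(n_estimators=100).fit(X, y)
# At test time, clf.predict_proba(feats) gives per-superpixel class posteriors
# that can act as unary potentials of an MRF over the superpixel adjacency graph.
```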

Related articles

3DFS: Deformable Dense Depth Fusion and Segmentation for Object Reconstruction from a Handheld Camera

We propose an approach for 3D reconstruction and segmentation of a single object placed on a flat surface from an input video. Our approach is to perform dense depth map estimation for multiple views using a proposed objective function that preserves detail. The resulting depth maps are then fused using a proposed implicit surface function that is robust to estimation error, producing a smooth ...
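
The fusion step described above combines per-view depth maps through an implicit surface function that is robust to estimation error. As a loose, simplified stand-in (not the authors' formulation), the sketch below averages truncated signed distances along the reference view's pixel rays, assuming the depth maps are already registered to that view; the function name and parameters are hypothetical.

```python
import numpy as np

def fuse_depth_maps(depth_maps, z_samples, trunc=0.05):
    """Average truncated signed distances from several registered depth maps
    along each reference-view pixel ray, then read the fused depth off the
    (approximate) zero crossing. depth_maps: list of (H, W) arrays in a common
    reference view; z_samples: (D,) candidate depths along every ray."""
    H, W = depth_maps[0].shape
    tsdf = np.zeros((H, W, z_samples.size))
    weight = np.zeros_like(tsdf)
    for depth in depth_maps:
        sdf = depth[:, :, None] - z_samples[None, None, :]        # > 0 in front of the surface
        valid = np.isfinite(depth)[:, :, None] & (sdf > -trunc)   # ignore samples far behind it
        obs = np.clip(sdf / trunc, -1.0, 1.0)
        tsdf = np.where(valid, (tsdf * weight + obs) / (weight + 1.0), tsdf)
        weight = weight + valid
    # approximate zero crossing: the depth sample with the smallest |tsdf|
    fused = z_samples[np.argmin(np.abs(tsdf), axis=2)]
    return np.where(weight.sum(axis=2) > 0, fused, np.nan)
```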

Improving Depth Maps by Nonlinear Diffusion

Dense depth maps, typically produced by stereo algorithms, are essential for various computer vision applications. For general configurations in which the cameras are not necessarily parallel or close together, it often proves difficult to obtain reasonable results for complex scenes, in particular in occluded or textureless regions. To improve the depth map in such regions, we propose a post-p...
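
For a generic picture of edge-preserving nonlinear diffusion applied to a depth map, a Perona-Malik-style scheme (not the specific post-processing proposed in this paper) might look like this:

```python
import numpy as np

def nonlinear_diffusion(depth, n_iter=50, kappa=0.02, step=0.2):
    """Perona-Malik-style diffusion: smooth unreliable regions of a depth map
    while damping diffusion across large depth discontinuities, so object
    boundaries are preserved."""
    g = lambda x: np.exp(-(x / kappa) ** 2)      # edge-stopping conductance
    d = depth.astype(float).copy()
    for _ in range(n_iter):
        # differences to the four neighbours (np.roll wraps at the border;
        # a real implementation would replicate border pixels instead)
        dn = np.roll(d, -1, axis=0) - d
        ds = np.roll(d,  1, axis=0) - d
        de = np.roll(d, -1, axis=1) - d
        dw = np.roll(d,  1, axis=1) - d
        d = d + step * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return d
```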

Efficient edge-aware surface mesh reconstruction for urban scenes

We propose an efficient approach for building compact, edge-preserving, view-centric triangle meshes from either dense or sparse depth data, with a focus on modeling architecture in large-scale urban scenes. Our method constructs a 2D base mesh from a preliminary view partitioning, then lifts the base mesh into 3D in a fast vertex depth optimization. Different view partitioning schemes are prop...
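
A heavily simplified sketch of the "lift a 2D base mesh into 3D" idea, assuming known camera intrinsics K and skipping the paper's view partitioning and vertex depth optimization (the grid spacing and function name are hypothetical):

```python
import numpy as np

def lift_grid_mesh(depth, K, step=8):
    """Build a regular triangle grid over the image and lift each vertex to 3D
    by back-projecting its sampled depth with the intrinsic matrix K."""
    H, W = depth.shape
    vs, us = np.arange(0, H, step), np.arange(0, W, step)
    uu, vv = np.meshgrid(us, vs)                          # vertex pixel coordinates
    z = depth[vv, uu]
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    verts = np.stack([(uu - cx) * z / fx, (vv - cy) * z / fy, z], axis=-1)
    # two triangles per grid cell, indexing into the flattened vertex array
    rows, cols = vv.shape
    idx = np.arange(rows * cols).reshape(rows, cols)
    a, b, c, d = idx[:-1, :-1], idx[:-1, 1:], idx[1:, :-1], idx[1:, 1:]
    faces = np.concatenate([np.stack([a, b, c], axis=-1).reshape(-1, 3),
                            np.stack([b, d, c], axis=-1).reshape(-1, 3)])
    return verts.reshape(-1, 3), faces
```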

3D Reconstruction of Dynamic Scenes with Multiple Handheld Cameras

Accurate dense 3D reconstruction of dynamic scenes from natural images is still very challenging. Most previous methods rely on a large number of fixed cameras to obtain good results. Some of these methods further require separation of static and dynamic points, which are usually restricted to scenes with known background. We propose a novel dense depth estimation method which can automatically...

Real-time Progressive 3D Semantic Segmentation for Indoor Scene

The widespread adoption of autonomous systems such as drones and assistant robots has created a need for real-time high-quality semantic scene segmentation. In this paper, we propose an efficient yet robust technique for on-the-fly dense reconstruction and semantic segmentation of 3D indoor scenes. To guarantee real-time performance, our method is built atop small clusters of voxels and a condi...
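
To make the notion of "small clusters of voxels" as labeling units concrete, the sketch below groups a point cloud into voxel cells and computes per-cluster centroids; the cluster ids could then serve as CRF nodes. This is illustrative only, not the authors' data structure, and the names are hypothetical.

```python
import numpy as np

def voxel_clusters(points, voxel_size=0.05):
    """Group 3D points into small voxel clusters and return a cluster id per
    point plus cluster centroids; such clusters (rather than raw points) would
    be the nodes of a CRF in an on-the-fly labeling pipeline."""
    keys = np.floor(points / voxel_size).astype(np.int64)   # integer voxel index per point
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()                                # cluster id for every point
    counts = np.bincount(inverse).astype(float)
    centroids = np.stack([np.bincount(inverse, weights=points[:, k]) / counts
                          for k in range(3)], axis=1)
    return inverse, centroids
```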

Publication date: 2010